Density Estimation

  1. How to use normalizing flows for density estimation.

  2. Why you should use normalizing flows.

  3. Example: anomaly detection.
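
A normalizing flow learns an invertible map f from the data distribution to a simple base distribution p_Z (typically a standard Gaussian). The change-of-variables formula then yields an exact, trainable density for every point:

    log p_X(x) = log p_Z(f(x)) + log |det J_f(x)|

Training maximizes this log-likelihood, and the same quantity later serves as an anomaly score: rare points receive low likelihood. As a minimal illustration of the formula, independent of EchoFlow and with all names made up for the example, consider a toy one-dimensional affine flow:

import numpy as np
from scipy.stats import norm

# Toy 1-D "flow": the affine map z = (x - mu) / sigma, whose inverse pushes
# the standard normal base density onto N(mu, sigma^2).
# Here log|det J_f(x)| = -log(sigma).
mu, sigma = 3.0, 2.0
x = np.array([2.5, 3.1, 10.0])           # 10.0 sits far from the bulk
z = (x - mu) / sigma                     # f(x)
log_px = norm.logpdf(z) - np.log(sigma)  # log p_Z(f(x)) + log|det J_f(x)|
print(log_px)                            # the outlier gets the lowest value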

[1]:
from scipy.io import loadmat

# Load the "lympho" (lymphography) anomaly-detection benchmark, distributed
# as a MATLAB .mat file (for example, via the ODDS repository).
lympho = loadmat("lympho.mat")
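
Before modeling, it helps to peek at what the file contains. ODDS-style .mat exports typically store the feature matrix under "X" and binary outlier labels under "y"; treat that layout as an assumption to verify for the file at hand:

# List the stored arrays, skipping MATLAB's "__header__"-style metadata keys.
# "X" should be the feature matrix; "y", if present, holds 0/1 outlier labels.
print({key: value.shape for key, value in lympho.items()
       if not key.startswith("__")})
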
[6]:
import pandas as pd
from echoflow import EchoFlow

# Wrap the feature matrix in a DataFrame, the input format EchoFlow consumes.
df = pd.DataFrame(lympho["X"])

# Build a flow from 8 RealNVP-style ("RNVP") coupling blocks and fit it by
# maximum likelihood; the printed training loss falls (and can go negative)
# as the model assigns higher likelihood to the data.
model = EchoFlow(nb_blocks=8, block_type="RNVP")
model.fit(df)
Epoch 10 | Train Loss 0.069
Epoch 20 | Train Loss -4.492
Epoch 30 | Train Loss -7.895
Epoch 40 | Train Loss -9.233
Epoch 50 | Train Loss -11.101
Epoch 60 | Train Loss -13.374
Epoch 70 | Train Loss -15.335
Epoch 80 | Train Loss -15.898
Epoch 90 | Train Loss -17.371
Epoch 100 | Train Loss -17.292
Epoch 110 | Train Loss -18.211
Epoch 120 | Train Loss -18.836
Epoch 130 | Train Loss -19.149
Epoch 140 | Train Loss -18.746
Epoch 150 | Train Loss -18.942
Epoch 160 | Train Loss -19.928
Epoch 170 | Train Loss -19.493
Epoch 180 | Train Loss -18.708
Epoch 190 | Train Loss -18.327
Epoch 200 | Train Loss -20.113
Epoch 210 | Train Loss -19.527
Epoch 220 | Train Loss -19.837
Epoch 230 | Train Loss -19.331
Epoch 240 | Train Loss -19.364
Epoch 250 | Train Loss -17.352
Epoch 260 | Train Loss -18.578
Epoch 270 | Train Loss -17.830
Epoch 280 | Train Loss -16.697
Epoch 290 | Train Loss -20.727
Epoch 300 | Train Loss -19.390
Epoch 310 | Train Loss -19.648
Epoch 320 | Train Loss -19.635
Epoch 330 | Train Loss -20.148
Epoch 340 | Train Loss -17.182
Epoch 350 | Train Loss -18.876
Epoch 360 | Train Loss -17.321
Epoch 370 | Train Loss -19.846
Epoch 380 | Train Loss -18.767
Epoch 390 | Train Loss -19.472
Epoch 400 | Train Loss -20.134
Epoch 410 | Train Loss -19.232
Epoch 420 | Train Loss -19.583
Epoch 430 | Train Loss -18.567
Epoch 440 | Train Loss -20.473
Epoch 450 | Train Loss -16.148
Epoch 460 | Train Loss -18.441
Epoch 470 | Train Loss -19.966
Epoch 480 | Train Loss -19.316
Epoch 490 | Train Loss -17.581
Epoch 500 | Train Loss -19.473
Epoch 510 | Train Loss -19.499
Epoch 520 | Train Loss -17.684
Epoch 530 | Train Loss -18.907
(Training interrupted manually with a KeyboardInterrupt after the loss plateaued around -19.)
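
The remaining step is to turn the fitted flow into an anomaly detector: score every row by its log-likelihood under the model and sort ascending, so the rarest records surface first. The cell below sketches this under two assumptions: that the fitted model exposes a per-row log-density method (log_likelihood is a hypothetical name here; check the EchoFlow API for the actual call), and that lympho["y"] carries ODDS-style ground-truth labels (1 = outlier).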
[ ]:
# Goal: compute per-row likelihoods, rank rows by likelihood, and show that
# the known outliers receive the lowest values.
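import numpy as np

# Score each row. NOTE: `model.log_likelihood` is a HYPOTHETICAL method name
# used for this sketch -- substitute the real EchoFlow per-row log-density API.
log_density = model.log_likelihood(df)

# Sort ascending: the least probable rows under the flow come first.
scores = pd.Series(np.asarray(log_density).ravel(), index=df.index)
ranked = scores.sort_values()

# ASSUMPTION: ODDS-style labels under "y" (1 = outlier). If so, the true
# outliers should cluster at the head of the ranking.
labels = pd.Series(np.asarray(lympho["y"]).ravel(), index=df.index)
print(ranked.head(10))
print("outliers among the 10 least likely rows:",
      int(labels[ranked.index[:10]].sum()))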